A Model Context Protocol (MCP) service that provides access to Ansible Automation Platform (AAP) APIs through OpenAPI specifications.
Lightweight CLI to interact with MCP servers
mcp-cli is a lightweight CLI that enables dynamic discovery of MCP servers, reducing token consumption and making tool interactions more efficient for AI coding agents.
This article details how to build powerful, local AI automations using n8n, the Model Context Protocol (MCP), and Ollama, with the aim of replacing fragile scripts and expensive cloud-based APIs. Together these tools automate tasks such as log triage, data quality monitoring, dataset labeling, research brief updates, incident postmortems, contract review, and code review, all while keeping data and processing local for greater control and efficiency.
**Key Points:**
* **Local Focus:** The system prioritizes running LLMs locally for speed, cost-effectiveness, and data privacy.
* **Component Roles:** n8n orchestrates workflows, MCP constrains tool usage, and Ollama provides reasoning capabilities (see the sketch after this list).
* **Automation Examples:** The article showcases several practical automation examples across various domains, from DevOps to legal compliance.
* **Controlled Access:** MCP limits the model's access to only necessary tools and data, enhancing security and reliability.
* **Closed-Loop Systems:** Many automations incorporate feedback loops for continuous improvement and reduced human intervention.
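
To make the component roles concrete, here is a minimal sketch of that division of labor: a local MCP server exposes a single, narrowly scoped log-triage tool, and the tool delegates reasoning to a locally served Ollama model. The server name, tool name, and model choice are illustrative assumptions, not taken from the article; the sketch assumes the official `mcp` Python SDK (FastMCP) and Ollama's local REST API on its default port.

```python
# Minimal sketch (assumptions: official `mcp` Python SDK, Ollama running locally
# on its default port, and the model name "llama3" -- all illustrative, not from
# the article). An orchestrator such as n8n connects as an MCP client and can
# only call the tools this server chooses to expose.
import requests
from mcp.server.fastmcp import FastMCP

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint
MODEL = "llama3"                                    # hypothetical local model choice

mcp = FastMCP("log-triage")  # hypothetical server name


@mcp.tool()
def triage_logs(log_excerpt: str) -> str:
    """Classify a log excerpt by severity and suggest a next step."""
    prompt = (
        "You are a log triage assistant. Classify the severity "
        "(info/warning/error/critical) and suggest one next step.\n\n"
        f"Logs:\n{log_excerpt}"
    )
    # Reasoning happens locally in Ollama; no data leaves the machine.
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    # Serves the tool over stdio so an MCP-aware client (e.g. an n8n MCP node)
    # can discover and invoke it.
    mcp.run()
```

The point of the constraint is that the orchestrator never gets arbitrary shell or API access; it sees only `triage_logs`.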
MCP-native command line interface for Z.AI capabilities: vision analysis, web search, web reader, and GitHub repo exploration.
A tutorial showing how to use the MCP framework with EyelevelAI's GroundX to build a Retrieval-Augmented Generation (RAG) system for complex documents, including setup of a local MCP server, creation of ingestion and search tools, and integration with the Cursor IDE.
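
As a rough illustration of the tool shape the tutorial describes, the sketch below registers an ingestion tool and a search tool on a local MCP server that an IDE such as Cursor could connect to. The GroundX calls are stubbed out as hypothetical placeholders (`groundx_ingest`, `groundx_search`); consult the tutorial and the GroundX SDK for the real client calls.

```python
# Sketch of the two-tool layout described in the tutorial (ingest + search).
# The GroundX calls are hypothetical placeholders -- substitute the real
# GroundX SDK calls from the tutorial. Assumes the official `mcp` Python SDK.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("groundx-rag")  # hypothetical server name


def groundx_ingest(path: str) -> str:
    """Placeholder for the GroundX ingestion call (upload/parse a document)."""
    raise NotImplementedError("Replace with the GroundX SDK ingestion call.")


def groundx_search(query: str) -> str:
    """Placeholder for the GroundX semantic search call."""
    raise NotImplementedError("Replace with the GroundX SDK search call.")


@mcp.tool()
def ingest_document(path: str) -> str:
    """Ingest a local document (PDF, slide deck, etc.) into the RAG index."""
    return groundx_ingest(path)


@mcp.tool()
def search_documents(query: str) -> str:
    """Search ingested documents and return the most relevant passages."""
    return groundx_search(query)


if __name__ == "__main__":
    mcp.run()  # stdio transport; register this command in Cursor's MCP settings
```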
This article details how to build a 100% local MCP (Model Context Protocol) client using LlamaIndex, Ollama, and LightningAI. It provides a code walkthrough and explanation of the process, including setting up an SQLite MCP server and a locally served LLM.
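
For a sense of the client side the walkthrough covers, here is a minimal sketch using the official `mcp` Python SDK directly (the article layers LlamaIndex and an Ollama-served LLM on top of this): spawn an SQLite MCP server over stdio, list its tools, and invoke one. The server command, `--db-path` flag, and `list_tables` tool name are assumptions modeled on the reference SQLite server; adjust them to whatever server you actually run.

```python
# Sketch of a local MCP client session (assumptions: official `mcp` Python SDK,
# an SQLite MCP server runnable via `uvx`, and a `list_tables` tool -- adjust
# to the server used in the article's walkthrough).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Spawn the SQLite MCP server as a subprocess and talk to it over stdio.
    params = StdioServerParameters(
        command="uvx",
        args=["mcp-server-sqlite", "--db-path", "demo.db"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes (this is what LlamaIndex
            # wraps into agent tools for the locally served LLM).
            tools = await session.list_tools()
            print("tools:", [tool.name for tool in tools.tools])

            # Call one tool directly; an agent would choose this via reasoning.
            result = await session.call_tool("list_tables", arguments={})
            print("result:", result.content)


if __name__ == "__main__":
    asyncio.run(main())
```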
An extensible Model Context Protocol (MCP) server that provides intelligent semantic code search for AI assistants. Built with local AI models using Matryoshka Representation Learning (MRL) for flexible embedding dimensions.
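
The MRL detail is worth unpacking: an MRL-trained model concentrates the most important information in the leading dimensions of each embedding, so a vector can be truncated to a prefix and renormalized to trade accuracy for index size and speed. Below is a small numpy sketch of that truncation step, with random vectors standing in for real code embeddings; it illustrates only the mechanics, not this project's actual models or API.

```python
# Sketch of MRL-style truncation: keep only the first k dimensions of an
# embedding and renormalize, so cosine similarity still works at lower cost.
# Random vectors stand in for embeddings from an actual MRL-trained model.
import numpy as np


def truncate_and_normalize(embeddings: np.ndarray, dim: int) -> np.ndarray:
    """Keep the leading `dim` dimensions and L2-normalize each row."""
    truncated = embeddings[:, :dim]
    norms = np.linalg.norm(truncated, axis=1, keepdims=True)
    return truncated / np.clip(norms, 1e-12, None)


rng = np.random.default_rng(0)
docs = rng.normal(size=(4, 768))      # e.g. 768-dim embeddings of code snippets
query = rng.normal(size=(1, 768))     # embedding of a search query

for k in (768, 256, 64):              # flexible dimensions per index/latency budget
    docs_k = truncate_and_normalize(docs, k)
    query_k = truncate_and_normalize(query, k)
    scores = docs_k @ query_k.T       # cosine similarity (rows are unit-norm)
    print(k, np.round(scores.ravel(), 3))
```

With a genuine MRL model the low-dimensional rankings track the full-dimensional ones closely; with random vectors, as here, the output only demonstrates the truncate-and-renormalize step.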
Anthropic is donating the Model Context Protocol (MCP) to the Agentic AI Foundation (AAIF), a directed fund under the Linux Foundation, to foster open and collaborative development of agentic AI.
Over the last year, MCP rose to popularity at a pace few other standards or technologies have matched. This article details the unlikely rise of the Model Context Protocol (MCP) and its journey to becoming a generally accepted standard for AI connectivity.